Stochastic Zeroth-order Optimization in High Dimensions
Authors
Abstract
We consider the problem of optimizing a high-dimensional convex function using stochastic zeroth-order queries. Under sparsity assumptions on the gradients or function values, we present two algorithms: a successive component/feature selection algorithm and a noisy mirror descent algorithm using Lasso gradient estimates, and show that both algorithms have convergence rates that depend only logarithmically on the ambient dimension of the problem. Empirical results confirm our theoretical findings and show that the algorithms we design outperform classical zeroth-order optimization methods in the high-dimensional setting.
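To make the Lasso gradient estimation idea concrete, here is a minimal sketch, not the authors' exact procedure: directional finite differences obtained from zeroth-order queries act as linear measurements of the gradient, and an l1-penalised regression recovers a sparse estimate. The function names, the Rademacher sensing directions, and all hyperparameters below are illustrative assumptions, and a plain gradient step stands in for the mirror descent update used in the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_gradient_estimate(f, x, n_queries=100, delta=1e-3, lam=0.01, rng=None):
    """Sparse gradient estimate of f at x from zeroth-order queries (sketch).

    Directional finite differences y_i ~ z_i . grad f(x) are treated as linear
    measurements, and Lasso recovers a sparse gradient vector.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    Z = rng.choice([-1.0, 1.0], size=(n_queries, d))            # random sensing directions
    fx = f(x)
    y = np.array([(f(x + delta * z) - fx) / delta for z in Z])  # approx Z @ grad f(x)
    return Lasso(alpha=lam, fit_intercept=False).fit(Z, y).coef_

def zeroth_order_descent(f, x0, steps=50, lr=0.1, **kw):
    """Descent loop using the Lasso gradient estimates; a plain gradient step
    is used here only to keep the sketch short."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * lasso_gradient_estimate(f, x, **kw)
    return x
```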
Similar Resources
Zeroth-order Asynchronous Doubly Stochastic Algorithm with Variance Reduction
Zeroth-order (derivative-free) optimization attracts a lot of attention in machine learning, because explicit gradient calculations may be computationally expensive or infeasible. To handle problems that are large scale in both volume and dimension, asynchronous doubly stochastic zeroth-order algorithms were recently proposed. The convergence rate of existing asynchronous doubly stochastic zeroth order ...
On Zeroth-Order Stochastic Convex Optimization via Random Walks
We propose a method for zeroth-order stochastic convex optimization that attains the suboptimality rate of Õ(n^7 T^(-1/2)) after T queries for a convex bounded function f : R^n → R. The method is based on a random walk (the Ball Walk) on the epigraph of the function. The randomized approach circumvents the problem of gradient estimation, and appears to be less sensitive to noisy function evaluations c...
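As a rough illustration of the epigraph random-walk idea, the sketch below runs a Ball Walk whose proposals are accepted only when they stay inside the epigraph of f, so no gradient estimate is ever formed. It assumes noise-free evaluations and a box-bounded domain, and it simply returns the best point visited; the cited paper's method is more refined, and every name and default here is an assumption.

```python
import numpy as np

def ball_walk_minimize(f, x0, radius=0.05, steps=5000, bound=10.0, rng=None):
    """Minimal sketch of a Ball Walk on the epigraph {(x, t) : f(x) <= t}."""
    rng = np.random.default_rng() if rng is None else rng
    d = x0.shape[0]
    z = np.append(x0, f(x0) + 1.0)            # current point (x, t), inside the epigraph
    best_x, best_val = x0.copy(), f(x0)
    for _ in range(steps):
        direction = rng.normal(size=d + 1)
        direction /= np.linalg.norm(direction)
        r = radius * rng.random() ** (1.0 / (d + 1))       # uniform draw from the ball
        cand = z + r * direction
        x_c, t_c = cand[:d], cand[d]
        val = f(x_c)
        if np.all(np.abs(cand) <= bound) and val <= t_c:   # membership test: stay in the epigraph
            z = cand
            if val < best_val:
                best_x, best_val = x_c.copy(), val
    return best_x
```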
A Comprehensive Linear Speedup Analysis for Asynchronous Stochastic Parallel Optimization from Zeroth-Order to First-Order
Asynchronous parallel optimization has recently received substantial success and extensive attention. One of the core theoretical questions is how much speedup (or benefit) asynchronous parallelization can bring. This paper provides a comprehensive and generic analysis of the speedup property for a broad range of asynchronous parallel stochastic algorithms from the zeroth order to the...
Sufficient conditions on the zeroth-order general Randic index for maximally edge-connected digraphs
Let D be a digraph with vertex set V(D). For a vertex v ∈ V(D), the degree of v, denoted by d(v), is defined as the minimum of its out-degree and its in-degree. Now let D be a digraph with minimum degree δ and edge-connectivity λ. If a is a real number, then the zeroth-order general Randic index is defined by Σ_{v ∈ V(D)} d(v)^a. A digraph is maximally edge-connected if λ = δ. In this paper we present sufficient condi...
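For concreteness, here is a small helper (illustrative only, not code from the cited paper) that computes the index exactly as defined above, with the digraph given as a list of directed edges.

```python
from collections import defaultdict

def zeroth_order_general_randic_index(edges, a):
    """Sum over vertices v of d(v)**a, where d(v) = min(out-degree, in-degree)."""
    out_deg, in_deg = defaultdict(int), defaultdict(int)
    for u, v in edges:
        out_deg[u] += 1
        in_deg[v] += 1
    vertices = set(out_deg) | set(in_deg)
    return sum(min(out_deg[v], in_deg[v]) ** a for v in vertices)

# Example: a directed 3-cycle, where every vertex has in-degree = out-degree = 1,
# so the index equals 3 * 1**a = 3 for any a.
print(zeroth_order_general_randic_index([(0, 1), (1, 2), (2, 0)], a=2))  # 3
```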
Portfolio Optimization under Local-Stochastic Volatility: Coefficient Taylor Series Approximations and Implied Sharpe Ratio
We study the finite horizon Merton portfolio optimization problem in a general local-stochastic volatility setting. Using model coefficient expansion techniques, we derive approximations for both the value function and the optimal investment strategy. We also analyze the 'implied Sharpe ratio' and derive a series approximation for this quantity. The zeroth-order approximation of the value fu...
Journal: CoRR
Volume: abs/1710.10551
Year: 2017